# Southeast Asian Language Processing
## Burmesebert
Burmese-Bert is a bilingual masked language model based on bert-large-uncased, supporting both English and Burmese.
- Tags: Large Language Model · Transformers · Supports Multiple Languages
- Author: jojo-ai-mst · Downloads: 20 · Likes: 2
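Burmesebert and several of the models in this list are BERT-style masked language models. As background, here is a minimal sketch of the standard BERT masking objective in plain Python (assuming the usual scheme: 15% of tokens are selected for prediction; of those, 80% become `[MASK]`, 10% become a random token, and 10% are left unchanged):

```python
import random

def mask_tokens(tokens, vocab, mask_rate=0.15, seed=0):
    """Apply BERT-style masking. Returns (masked_tokens, labels).

    labels[i] holds the original token wherever the model must predict it,
    and None wherever the token was not selected for masking.
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            labels.append(tok)  # model must recover the original token
            roll = rng.random()
            if roll < 0.8:
                masked.append("[MASK]")           # 80%: replace with [MASK]
            elif roll < 0.9:
                masked.append(rng.choice(vocab))  # 10%: random token
            else:
                masked.append(tok)                # 10%: keep unchanged
        else:
            labels.append(None)
            masked.append(tok)
    return masked, labels

# Toy sentence; a real pipeline would use the model's own subword tokenizer.
tokens = "saya suka makan nasi goreng di pagi hari".split()
masked, labels = mask_tokens(tokens, vocab=tokens, seed=42)
print(masked)
print(labels)
```

The pretrained models above learn to fill the masked positions from context; the same objective is what "masked language model" refers to throughout this list.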
## Javanese Bert Small
A Javanese masked language model based on the BERT architecture, fine-tuned on Javanese Wikipedia data.
- License: MIT
- Tags: Large Language Model · Other
- Author: w11wo · Downloads: 22 · Likes: 0
## Sundanese Roberta Base
A Sundanese masked language model based on the RoBERTa architecture, trained on multiple datasets.
- License: MIT
- Tags: Large Language Model · Other
- Author: w11wo · Downloads: 32 · Likes: 2
## Roberta Base Indonesian 1.5G Sentiment Analysis Smsa
A sentiment analysis model fine-tuned from Indonesian RoBERTa, achieving 92.6% accuracy on the indonlu dataset.
- Tags: Text Classification · Transformers · Other
- Author: ayameRushia · Downloads: 44 · Likes: 4
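The 92.6% figure above is plain classification accuracy: the fraction of test examples whose predicted label matches the gold label. As a quick illustration (with made-up labels, not the actual indonlu evaluation data):

```python
def accuracy(predictions, gold):
    """Fraction of predictions that exactly match the gold labels."""
    assert len(predictions) == len(gold)
    correct = sum(p == g for p, g in zip(predictions, gold))
    return correct / len(gold)

# Toy example using the three SmSA sentiment classes.
gold = ["positive", "negative", "neutral", "positive", "negative"]
pred = ["positive", "negative", "neutral", "negative", "negative"]
print(f"accuracy = {accuracy(pred, gold):.1%}")  # prints "accuracy = 80.0%"
```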
## Melayubert
A Malay masked language model based on the BERT architecture, trained on the Malay subset of the OSCAR dataset, with support for both PyTorch and TensorFlow.
- License: MIT
- Tags: Large Language Model · Transformers · Other
- Author: StevenLimcorn · Downloads: 15 · Likes: 0
## Indobert Base Uncased Finetuned Indonlu Smsa
A text classification model based on indobert-base-uncased and fine-tuned on the indonlu dataset for Indonesian sentiment analysis.
- License: MIT
- Tags: Text Classification · Transformers · Other
- Author: ayameRushia · Downloads: 31 · Likes: 0
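Both sentiment models in this list attach a classification head to the encoder: the pooled sentence representation is projected to one logit per class, and a softmax turns those logits into probabilities. A minimal sketch of that final step, with hypothetical logits standing in for real encoder output:

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

labels = ["positive", "neutral", "negative"]
logits = [2.1, 0.3, -1.2]  # hypothetical classification-head output
probs = softmax(logits)
prediction = labels[max(range(len(probs)), key=probs.__getitem__)]
print(prediction, [round(p, 3) for p in probs])
```

Fine-tuning (as done for the two models above) trains both the projection and the underlying encoder on labeled sentiment data.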